An Investigation of Newton-Sketch and Subsampled Newton Methods

Authors

  • Albert S. Berahas
  • Raghu Bollapragada
  • Jorge Nocedal
Abstract

The concepts of sketching and subsampling have recently received much attention from the optimization and statistics communities. In this paper, we study the Newton-Sketch and Subsampled Newton (SSN) methods for the finite-sum optimization problem. We consider practical versions of the two methods in which the Newton equations are solved approximately using the conjugate gradient (CG) method or a stochastic gradient iteration. We establish new complexity results for the SSN-CG method that exploit the spectral properties of CG. Controlled numerical experiments compare the relative strengths of Newton-Sketch and SSN methods and show that for many finite-sum problems, they are far more efficient than SVRG, a popular first-order method.
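
To make the setup concrete, here is a minimal sketch of one SSN-CG iteration for an L2-regularized logistic-regression finite sum: the gradient uses all n examples, Hessian-vector products use only a random subsample, and CG solves the Newton system to a loose relative tolerance. The sample fraction, CG tolerance, and unit step length are illustrative defaults, not the paper's tuned choices.

```python
import numpy as np

def ssn_cg_step(w, X, y, lam=1e-3, sample_frac=0.1, cg_tol=1e-2, cg_maxit=50, rng=None):
    """One Subsampled Newton-CG step for L2-regularized logistic regression.

    Gradient: exact, over all n examples. Hessian: matrix-free, over a random
    subsample S. CG: stopped early at a loose relative residual.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape

    # Full gradient of (1/n) * sum_i log(1 + exp(-y_i x_i^T w)) + (lam/2)||w||^2.
    z = y * (X @ w)
    sig = 1.0 / (1.0 + np.exp(z))                  # sigmoid(-z_i)
    g = -(X.T @ (y * sig)) / n + lam * w
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:
        return w

    # Subsampled Hessian H_S = (1/|S|) X_S^T D X_S + lam*I, applied matrix-free.
    S = rng.choice(n, size=max(1, int(sample_frac * n)), replace=False)
    Xs = X[S]
    s = 1.0 / (1.0 + np.exp(y[S] * (Xs @ w)))
    D = s * (1.0 - s)                              # per-example curvature weights

    def hess_vec(v):
        return Xs.T @ (D * (Xs @ v)) / len(S) + lam * v

    # Conjugate gradient on H_S p = -g.
    p = np.zeros(d)
    r = -g.copy()
    q = r.copy()
    rs = r @ r
    for _ in range(cg_maxit):
        Hq = hess_vec(q)
        alpha = rs / (q @ Hq)
        p += alpha * q
        r -= alpha * Hq
        rs_new = r @ r
        if np.sqrt(rs_new) <= cg_tol * gnorm:
            break
        q = r + (rs_new / rs) * q
        rs = rs_new
    return w + p                                   # unit step; a line search can be added
```

Because H_S is only ever applied to vectors, each CG iteration costs O(|S| d) rather than the O(|S| d^2) needed to form the subsampled Hessian explicitly.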

Similar Articles

Subsampled Hessian Newton Methods for Supervised Learning

Newton methods can be applied in many supervised learning approaches. However, for large-scale data, the use of the whole Hessian matrix can be time-consuming. Recently, subsampled Newton methods have been proposed to reduce the computational time by using only a subset of data for calculating an approximation of the Hessian matrix. Unfortunately, we find that in some situations, the running sp...

Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence

We propose a randomized second-order method for optimization known as the Newton sketch: it is based on performing an approximate Newton step using a randomly projected Hessian. For self-concordant functions, we prove that the algorithm has superlinear convergence with exponentially high probability, with convergence and complexity guarantees that are independent of condition numbers and relate...
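
As a minimal illustration of the idea, the sketch below takes one Newton-sketch step on a least-squares objective, where the data matrix is itself a Hessian square root; a dense Gaussian sketch stands in for the faster randomized projections (e.g., randomized Hadamard transforms) that make the method near linear-time.

```python
import numpy as np

def newton_sketch_step_lsq(w, A, b, m=None, rng=None):
    """One Newton-sketch step for f(w) = 0.5 * ||A w - b||^2.

    Here H = A^T A, so A is a Hessian square root; sketching it with an
    m x n Gaussian matrix S gives the approximation (S A)^T (S A) ~ H
    while reducing the row dimension from n to m.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = A.shape
    m = 4 * d if m is None else m                  # sketch size scales with d, not n

    g = A.T @ (A @ w - b)                          # exact gradient
    S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sketch, E[S^T S] = I
    SA = S @ A                                     # m x d sketched square root
    p = np.linalg.lstsq(SA.T @ SA, -g, rcond=None)[0]
    return w + p
```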

An efficient improvement of the Newton method for solving nonconvex optimization problems

Newton's method is one of the most famous line search methods for minimizing functions. It is well known that the search direction and step length play important roles in this class of methods for solving optimization problems. In this investigation, a new modification of the Newton method for solving unconstrained optimization problems is presented. The significant ...
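
The abstract is truncated here, so the sketch below shows only the classical ingredients it names, a Newton direction combined with a backtracking (Armijo) line search for the step length; the paper's specific modification is not reproduced, and the constants are conventional defaults.

```python
import numpy as np

def damped_newton(f, grad, hess, w, c=1e-4, rho=0.5, tol=1e-8, max_iter=100):
    """Newton's method with a backtracking (Armijo) line search."""
    for _ in range(max_iter):
        g = grad(w)
        if np.linalg.norm(g) <= tol:
            break
        p = np.linalg.solve(hess(w), -g)           # Newton search direction
        if g @ p >= 0:                             # not a descent direction
            p = -g                                 # (possible for nonconvex f)
        t = 1.0
        while f(w + t * p) > f(w) + c * t * (g @ p):
            t *= rho                               # shrink until Armijo holds
        w = w + t * p
    return w
```

The Armijo condition guarantees sufficient decrease at each step, and the fallback to the negative gradient keeps the iteration a descent method when the Hessian is indefinite.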

Exact and Inexact Subsampled Newton Methods for Optimization

The paper studies the solution of stochastic optimization problems in which approximations to the gradient and Hessian are obtained through subsampling. We first consider Newton-like methods that employ these approximations and discuss how to coordinate the accuracy in the gradient and Hessian to yield a superlinear rate of convergence in expectation. The second part of the paper analyzes an in...
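
As a rough illustration of what coordinating the accuracy of the two approximations can look like, the sketch below generates growing sample sizes for the gradient and Hessian estimates; the growth factors are placeholders, and the precise growth conditions required for a superlinear rate in expectation are exactly what the paper's analysis pins down.

```python
def sample_schedule(n, s_grad=64, s_hess=64, g_growth=2.0, h_growth=1.5, iters=15):
    """Illustrative per-iteration sample sizes (capped at the dataset size n).

    Both samples grow so that the gradient and Hessian errors shrink together;
    how fast each must grow for superlinear convergence is specified by the
    paper's theory, not by these placeholder factors.
    """
    schedule = []
    for _ in range(iters):
        schedule.append((min(int(s_grad), n), min(int(s_hess), n)))
        s_grad *= g_growth
        s_hess *= h_growth
    return schedule

# Example: sizes for a dataset with n = 100000 examples.
print(sample_schedule(100000)[:5])
```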

A New High Order Closed Newton-Cotes Trigonometrically-fitted Formulae for the Numerical Solution of the Schrodinger Equation

In this paper, we investigate the connection between closed Newton-Cotes formulae, trigonometrically-fitted methods, symplectic integrators and efficient integration of the Schrödinger equation. Multistep symplectic integrators have received little study, although in recent decades several one-step symplectic integrators have been produced based on symplectic geometry (see the relevant lit...
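
For orientation only, the sketch below implements composite Simpson's rule, the simplest member of the closed Newton-Cotes family the paper builds on; the trigonometric fitting, which retunes the weights so that sines and cosines of a known frequency are integrated exactly, is not reproduced here.

```python
import numpy as np

def composite_simpson(f, a, b, n=200):
    """Composite Simpson's rule: the 3-point closed Newton-Cotes formula
    applied panel by panel (n must be even; it is rounded up if not)."""
    if n % 2:
        n += 1
    x = np.linspace(a, b, n + 1)
    fx = f(x)
    h = (b - a) / n
    return h / 3 * (fx[0] + fx[-1] + 4 * fx[1:-1:2].sum() + 2 * fx[2:-1:2].sum())

# Oscillatory integrand, the regime where trigonometrically-fitted
# formulas outperform plain Newton-Cotes: integral of cos(10 t) on [0, pi].
print(composite_simpson(lambda t: np.cos(10 * t), 0.0, np.pi))  # exact value is 0
```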


Journal:
  • CoRR

Volume: abs/1705.06211  Issue: –

Pages: –

Publication date: 2017